
    Collision Helps - Algebraic Collision Recovery for Wireless Erasure Networks

    Current medium access control mechanisms are based on collision avoidance, and collided packets are discarded. The recent work on ZigZag decoding departs from this approach by recovering the original packets from multiple collisions. In this paper, we present an algebraic representation of collisions which allows us to view each collision as a linear combination of the original packets. The transmitted, colliding packets may themselves be a coded version of the original packets. We propose a new acknowledgment (ACK) mechanism for collisions based on the idea that if a set of packets collide, the receiver can afford to ACK exactly one of them and still decode all the packets eventually. We analytically compare the delay and throughput performance of such collision recovery schemes with collision avoidance approaches in the context of a single-hop wireless erasure network. In the multiple-receiver case, the broadcast constraint calls for combining collision recovery methods with network coding across packets at the sender. From the delay perspective, our scheme, without any coordination, outperforms not only ALOHA-type random access mechanisms but also centralized scheduling. For the case of streaming arrivals, we propose a priority-based ACK mechanism and show that its stability region coincides with the cut-set bound of the packet erasure network.
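    The algebraic view can be made concrete: if each collision is the XOR (a linear combination over GF(2)) of the packets involved, a receiver that has collected as many linearly independent collisions as packets recovers everything by Gaussian elimination. A minimal sketch, with hypothetical packet contents and a hypothetical coefficient matrix:

    ```python
    import numpy as np

    def gf2_solve(A, b):
        """Solve A x = b over GF(2) by Gaussian elimination;
        rows of b are byte vectors (the observed collisions)."""
        A, b = A.copy() % 2, b.copy()
        n = A.shape[0]
        for col in range(n):
            pivot = next(r for r in range(col, n) if A[r, col])
            A[[col, pivot]], b[[col, pivot]] = A[[pivot, col]], b[[pivot, col]]
            for r in range(n):
                if r != col and A[r, col]:
                    A[r] ^= A[col]   # row operations over GF(2) are XORs
                    b[r] ^= b[col]
        return b

    # Three original packets (hypothetical byte vectors).
    packets = np.array([[0x12, 0x34], [0xAB, 0xCD], [0x0F, 0xF0]], dtype=np.uint8)
    # Each row of A says which packets collided in one reception.
    A = np.array([[1, 1, 0],                  # collision 1 = p0 ^ p1
                  [0, 1, 1],                  # collision 2 = p1 ^ p2
                  [1, 1, 1]], dtype=np.uint8) # collision 3 = p0 ^ p1 ^ p2
    collisions = np.zeros_like(packets)
    for i in range(3):
        for j in range(3):
            if A[i, j]:
                collisions[i] ^= packets[j]

    recovered = gf2_solve(A, collisions)
    ```

    Since A is invertible over GF(2), `recovered` reproduces the original packets exactly, which is the sense in which collisions carry full information rather than being waste.
    
    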

    On Resource Allocation in Fading Multiple Access Channels - An Efficient Approximate Projection Approach

    We consider the problem of rate and power allocation in a multiple-access channel. Our objective is to obtain rate and power allocation policies that maximize a general concave utility function of average transmission rates on the information-theoretic capacity region of the multiple-access channel. Our policies do not require queue-length information. We consider several different scenarios. First, we address the utility maximization problem in a nonfading channel to obtain the optimal operating rates, and present an iterative gradient projection algorithm that uses approximate projection. By exploiting the polymatroid structure of the capacity region, we show that the approximate projection can be implemented in time polynomial in the number of users. Second, we consider resource allocation in a fading channel. Optimal rate and power allocation policies are presented for the case where power control is possible and channel statistics are available. For the case where transmission power is fixed and channel statistics are unknown, we propose a greedy rate allocation policy and provide bounds on the performance difference between this policy and the optimal policy in terms of channel variations and the structure of the utility function. We present numerical results that demonstrate superior convergence-rate performance for the greedy policy compared to queue-length-based policies. In order to reduce the computational complexity of the greedy policy, we present approximate rate allocation policies which track the greedy policy within a certain neighborhood that is characterized in terms of the speed of fading.
    Comment: 32 pages, Submitted to IEEE Trans. on Information Theory
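    The flavor of gradient projection with a cheap projection step can be illustrated with a stylized two-user sketch (made-up rate caps and a generic approximate projection, not the paper's exact algorithm): take a gradient step on U(R) = sum(log R_i), then enforce the violated polymatroid constraints one at a time.

    ```python
    import numpy as np

    C = np.array([1.0, 1.5])   # hypothetical per-user rate caps
    C_SUM = 2.0                # hypothetical sum-rate cap

    def approx_project(R):
        """Cheap approximate projection: enforce violated constraints in turn."""
        R = np.minimum(np.maximum(R, 1e-6), C)   # individual-rate constraints
        excess = R.sum() - C_SUM
        if excess > 0:                           # sum-rate hyperplane
            R = np.minimum(np.maximum(R - excess / len(R), 1e-6), C)
        return R

    R = np.array([0.1, 0.1])
    for t in range(500):
        grad = 1.0 / R                           # gradient of U(R) = sum(log R_i)
        R = approx_project(R + (0.5 / (t + 1)) * grad)
    ```

    With a diminishing step size the iterates settle near the utility-maximizing point on the boundary of the region; the projection never requires solving a full quadratic program, which is the point of the approximate-projection idea.
    
    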

    Avoiding Interruptions - QoE Trade-offs in Block-coded Streaming Media Applications

    We take an analytical approach to study Quality of user Experience (QoE) for video streaming applications. First, we show that random linear network coding applied to blocks of video frames can significantly simplify the packet requests at the network layer and save resources by avoiding duplicate packet reception. Network coding allows us to model the receiver's buffer as a queue with Poisson arrivals and deterministic departures. We consider the probability of interruption in video playback as well as the number of initially buffered packets (initial waiting time) as the QoE metrics. We characterize the optimal trade-off between these metrics by providing upper and lower bounds on the minimum initial buffer size required to achieve a certain level of interruption probability for different regimes of the system parameters. Our bounds are asymptotically tight as the file size goes to infinity.
    Comment: Submitted to ISIT 2010 - Full version
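    The queueing model lends itself to a quick Monte-Carlo check (an illustrative discrete-time sketch with arbitrary parameters, not the paper's analysis): packets arrive as a Poisson process, one packet is played per slot, and playback is interrupted if the buffer empties before the file finishes.

    ```python
    import math
    import random

    def interruption_prob(lam, buffer0, file_size, trials=2000, seed=1):
        """Estimate the playback-interruption probability with Poisson(lam)
        arrivals per slot, one packet played per slot, and buffer0 packets
        pre-buffered before playback starts."""
        rng = random.Random(seed)

        def poisson(l):                    # Knuth's method, fine for small l
            L, k, p = math.exp(-l), 0, 1.0
            while True:
                p *= rng.random()
                if p <= L:
                    return k
                k += 1

        interruptions = 0
        for _ in range(trials):
            buf, received = buffer0, buffer0
            for _ in range(file_size):
                arr = min(poisson(lam), file_size - received)
                received += arr
                buf += arr - 1             # arrivals in, one packet played
                if buf < 0:
                    interruptions += 1
                    break
        return interruptions / trials
    ```

    For a fixed arrival rate slightly above the play rate, increasing the initial buffer should drive the estimated interruption probability down, mirroring the trade-off the bounds describe.
    
    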

    ‘Codes are not enough…’: a report of ongoing research

    We consider the problem of rate allocation in a fading Gaussian multiple-access channel (MAC) with fixed transmission powers. Our goal is to maximize a general concave utility function of transmission rates over the throughput capacity region. In contrast to earlier works in this context that propose solutions where a potentially complex optimization problem must be solved at every decision instant, we propose a low-complexity approximate rate allocation policy and analyze the effect of temporal channel variations on its utility performance. To the best of our knowledge, this is the first work that studies the tracking capabilities of an approximate rate allocation scheme under fading channel conditions. We build on an earlier work to present a new rate allocation policy for a fading MAC that implements a low-complexity approximate gradient projection iteration for each channel measurement, and explicitly characterize the effect of the speed of temporal channel variations on the tracking neighborhood of our policy. We further improve our results by proposing an alternative rate allocation policy for which tighter bounds on the size of the tracking neighborhood are derived. These proposed rate allocation policies are computationally efficient in our setting since they implement a single gradient projection iteration per channel measurement, and each such iteration relies on approximate projections, which have polynomial complexity in the number of users.
    Comment: 9 pages, In proc. of ITA 200
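    The tracking idea can be illustrated with a deliberately simplified single-user toy model (not the paper's MAC setting): the channel cap drifts as a bounded random walk, the rate takes exactly one projected-gradient step per measurement, and the average gap to the moving cap plays the role of the tracking neighborhood.

    ```python
    import random

    def avg_tracking_error(drift, steps=5000, seed=0):
        """Average |R_t - C_t| when the cap C_t moves by at most `drift`
        per slot and R_t takes one gradient-projection step per sample
        (utility U(R) = R, so the gradient is constant 1)."""
        rng = random.Random(seed)
        C, R, err = 1.0, 0.5, 0.0
        for _ in range(steps):
            # channel cap: bounded random walk in [0.5, 2.0]
            C = min(2.0, max(0.5, C + rng.uniform(-drift, drift)))
            R = min(R + 0.05, C)   # gradient step, then projection onto R <= C
            err += abs(R - C)
        return err / steps
    ```

    Slow fading leaves the iterate pinned to the cap; faster fading opens a larger average gap, which is the qualitative content of characterizing the tracking neighborhood in terms of the speed of channel variations.
    
    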

    Access-Network Association Policies for Media Streaming in Heterogeneous Environments

    We study the design of media streaming applications in the presence of multiple heterogeneous wireless access methods with different throughputs and costs. Our objective is to analytically characterize the trade-off between the usage cost and the Quality of user Experience (QoE), which is represented by the probability of interruption in media playback and the initial waiting time. We model each access network as a server that provides packets to the user according to a Poisson process with a certain rate and cost. Blocks are coded using random linear codes to alleviate the duplicate packet reception problem. Users must decide how many packets to buffer before playout and which networks to access during playout. We design, analyze and compare several control policies with a threshold structure. We formulate the problem of finding the optimal control policy as an MDP with a probabilistic constraint. We present the HJB equation for this problem by expanding the state space, and exploit it as a verification method for optimality of the proposed control law.
    Comment: submitted to CDC 201
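    The threshold structure is easy to see in simulation (all parameters hypothetical): keep the cheap network always on, and pay for the fast network only while the playout buffer sits below a threshold T.

    ```python
    import math
    import random

    def simulate(T, file_size=500, lam_cheap=0.8, lam_fast=0.8,
                 cost_fast=1.0, buffer0=20, seed=3):
        """Return (total cost, interrupted?) under a threshold policy:
        the costly network is used only while the buffer is below T."""
        rng = random.Random(seed)

        def poisson(l):
            L, k, p = math.exp(-l), 0, 1.0
            while True:
                p *= rng.random()
                if p <= L:
                    return k
                k += 1

        buf, received, cost = buffer0, buffer0, 0.0
        for _ in range(file_size):
            lam = lam_cheap
            if buf < T and received < file_size:   # threshold rule
                lam += lam_fast
                cost += cost_fast
            arr = min(poisson(lam), file_size - received)
            received += arr
            buf += arr - 1                         # one packet played per slot
            if buf < 0:
                return cost, True
        return cost, False
    ```

    Sweeping T traces out the cost/QoE trade-off: T = 0 never pays for the fast link, while a large T pays almost continuously, and the interesting policies sit in between.
    
    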

    Optimal Reverse Carpooling Over Wireless Networks - A Distributed Optimization Approach

    We focus on a particular form of network coding, reverse carpooling, in a wireless network where the potentially coded transmitted messages are to be decoded immediately upon reception. The network is fixed and known, and the system performance is measured in terms of the number of wireless broadcasts required to meet multiple unicast demands. Motivated by the structure of the coding scheme, we formulate the problem as a linear program by introducing a flow variable for each triple of connected nodes. This allows us to have a formulation polynomial in the number of nodes. Using dual decomposition and the projected subgradient method, we present a decentralized algorithm to obtain optimal routing schemes in the presence of coding opportunities. We show that the primal sub-problem can be expressed as a shortest path problem on an edge-graph, and the proposed algorithm requires each node to exchange information only with its neighbors.
    Comment: submitted to CISS 201
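    The edge-graph idea can be sketched as follows (costs hypothetical): nodes of the auxiliary graph are directed edges (a, b) of the wireless network, and the cost of continuing a -> b -> c is discounted when traffic already flows c -> b -> a, since the relay b can XOR the two packets into a single broadcast.

    ```python
    import heapq

    def edge_graph_shortest_path(adj, reverse_flow, src, dst):
        """Dijkstra on the edge-graph.
        adj: dict node -> set of neighbours (symmetric links).
        reverse_flow: set of triples (c, b, a) already carrying traffic,
        i.e. coding opportunities for a hop a -> b -> c."""
        FULL, CODED = 1.0, 0.5          # hypothetical per-broadcast costs
        pq = [(FULL, (src, v)) for v in adj[src]]   # states: directed edges
        heapq.heapify(pq)
        best = {}
        while pq:
            cost, (a, b) = heapq.heappop(pq)
            if (a, b) in best:
                continue
            best[(a, b)] = cost
            if b == dst:
                return cost
            for c in adj[b]:
                if c == a:
                    continue
                hop = CODED if (c, b, a) in reverse_flow else FULL
                heapq.heappush(pq, (cost + hop, (b, c)))
        return float("inf")

    # Line network 0-1-2-3 with existing traffic flowing 3 -> 2 -> 1 -> 0:
    adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
    reverse = {(3, 2, 1), (2, 1, 0)}
    ```

    On this example the route 0 -> 3 costs 2.0 broadcasts when the reverse flow is present (both relay hops are coded) versus 3.0 without it, which is the per-path saving that the LP aggregates over all demands.
    
    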

    Metrics, fundamental trade-offs and control policies for delay-sensitive applications in volatile environments

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (p. 137-142).
    With the explosion of consumer demand, media streaming will soon be the dominant type of Internet traffic. Since such applications are intrinsically delay-sensitive, conventional network control policies and coding algorithms may not be appropriate tools for data dissemination over networks. A major issue in the design and analysis of delay-sensitive applications is the notion of delay, which varies significantly across applications and time scales. We present a framework for studying the problem of media streaming in an unreliable environment. The focus of this work is on the end-user experience for such applications. First, we take an analytical approach to study fundamental rate-delay-reliability trade-offs in the context of media streaming for a single-receiver system. We consider the probability of interruption in media playback (buffer underflow) as well as the number of initially buffered packets (initial waiting time) as the Quality of user Experience (QoE) metrics. We characterize the optimal trade-off between these metrics as a function of system parameters such as the packet arrival rate and the file size, for different channel models. For a memoryless channel, we model the receiver's queue dynamics as an M/D/1 queue. Then, we show that for arrival rates slightly larger than the play rate, the minimum initial buffering required to achieve a certain level of interruption probability remains bounded as the file size grows. For the case where the arrival rate and the play rate match, the minimum initial buffer size should scale as the square root of the file size. We also study media streaming over channels with memory, modeled using Markovian arrival processes.
    We characterize the optimal trade-off curves for the infinite file size case, in such Markovian environments. Second, we generalize the results to the case of multiple servers or peers streaming to a single receiver. Random linear network coding allows us to simplify the packet selection strategies and alleviate issues such as duplicate packet reception. We show that the multi-server streaming problem over a memoryless channel can be transformed into a single-server streaming problem, for which we have characterized QoE trade-offs. Third, we study the design of media streaming applications in the presence of multiple heterogeneous wireless access methods with different access costs. Our objective is to analytically characterize the trade-off between usage cost and QoE metrics. We model each access network as a server that provides packets to the user according to a Poisson process with a certain rate and cost. Users must decide how many packets to buffer before playback and which networks to access during playback. We design, analyze and compare several control policies. In particular, we show that a simple Markov policy with a threshold structure performs the best. We formulate the problem of finding the optimal control policy as a Markov Decision Process (MDP) with a probabilistic constraint. We present the Hamilton-Jacobi-Bellman (HJB) equation for this problem by expanding the state space, and exploit it as a verification method for optimality of the proposed control policy. We use the tools and techniques developed for media streaming applications in the context of power supply networks. We study the value of storage in securing reliability of a system with uncertain supply and demand, and supply friction. We assume storage, when available, can be used to compensate, fully or partially, for a surge in demand or loss of supply.
    We formulate the problem of optimal utilization of storage with the objective of maximizing system reliability as minimization of the expected discounted cost of blackouts over an infinite horizon. We show that when the stage cost is linear in the size of the blackout, the optimal policy is myopic in the sense that all shocks are compensated by storage up to the available level of storage. However, when the stage cost is strictly convex, it may be optimal to curtail some of the demand and allow a small current blackout in the interest of maintaining a higher level of reserve to avoid a large blackout in the future. Finally, we examine the value of storage capacity in improving the system's reliability, as well as the effects of the associated optimal policies under different stage costs on the probability distribution of blackouts.
    by Ali ParandehGheibi. Ph.D.
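    The myopic-policy claim for linear stage costs can be checked with a toy value iteration (the model and every parameter below are hypothetical, not the thesis's formulation): storage recharges by one unit per period, a shock of fixed size arrives with some probability, and the controller chooses how much storage to discharge against it.

    ```python
    def solve(stage_cost, S=5, shock=2, p=0.3, beta=0.9, iters=500):
        """Infinite-horizon discounted value iteration over storage level s.
        Returns the value function and the optimal discharge policy."""
        V = [0.0] * (S + 1)
        policy = [0] * (S + 1)
        for _ in range(iters):
            newV = []
            for s in range(S + 1):
                # discharge a is chosen only when a shock actually hits;
                # the uncompensated part (shock - a) is the blackout
                q, a_star = min(
                    (stage_cost(shock - a) + beta * V[min(S, s - a + 1)], a)
                    for a in range(min(shock, s) + 1))
                policy[s] = a_star
                newV.append(p * q + (1 - p) * beta * V[min(S, s + 1)])
            V = newV
        return V, policy

    V_lin, pol_lin = solve(lambda b: b)   # linear stage cost
    ```

    With a linear stage cost the computed policy discharges as much as the shock requires at every storage level, i.e. it is myopic; swapping in a strictly convex cost (e.g. `lambda b: b ** 2`) is how one would probe, in this toy model, whether holding reserve can beat full compensation.
    
    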